doi: 10.17586/2226-1494-2023-23-5-1009-1020


System for customers’ routing based on their emotional state and age in public services systems 

G. Soma, G. D. Kopanitsa


Article in English

For citation:
Soma G.M., Kopanitsa G.D. System for customers’ routing based on their emotional state and age in public services systems. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2023, vol. 23, no. 5, pp. 1009–1020. doi: 10.17586/2226-1494-2023-23-5-1009-1020


Abstract
In this paper, we have developed a system for assigning customers to routes based on their emotional state and age in Public Service Systems (PSSs). The Squeeze-and-Excitation (SE) method was used to develop the models; it improves the efficiency of the Deep Convolutional Neural Network (DCNN) architecture by increasing the information flow between layers and enhancing important features. The method is based on squeezing and exciting information at each convolution stage, which yields a vector of channel importance scores that is then used to reweight the channels of the feature map. The study showed that this method improved classification quality and reduced model training time. A model of emotional target routing based on the Newton interpolation polynomial was developed to route customers according to their emotional state and age. The interpolation function in this model calculates the waiting time for customers according to their emotional state. Three binary classification models of emotion and age were developed: two models for recognizing the emotional state of the customer and one model for recognizing their age. The first and third models use a DCNN built from scratch with the new SE approach based on the attention mechanism, while the second model uses the Support Vector Machine (SVM) method. After training, the models were tested with the evaluate method, which assesses model quality on new data not used during training and thus shows how accurately the model predicts the target variable on unseen data; the assessment relies on metrics such as accuracy, recall, and F1-score. According to the experimental data obtained, the first and second models achieved validation accuracies of 72 % and 66 % on the FER-2013 and Adience datasets, respectively, with model sizes of 0.69 MB and 369 MB. The age recognition model achieved an accuracy of 88 % with a size of 1.68 MB. The mathematical model of emotional target routing (TERSS) was developed to minimize conflicts in public service systems. The developed system can automatically route customers to the appropriate operator based on their emotional state (presence of anger) and age: customers over 60 years old or with an anger level of 60–80 % are directed to a senior operator trained to communicate with elderly or emotionally agitated customers, while customers with an anger level of 80–100 % are directed to a psychologist. This research can be applied in PSSs to detect customers' age and anger. Moreover, it can be applied in various areas involving contact with large numbers of people, such as banks, supermarkets, airport access control systems, police stations, subways, and call centers.
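For illustration, the sketch below shows a minimal Squeeze-and-Excitation block of the kind described above, written in Keras. The reduction ratio, layer sizes, and surrounding toy classifier are illustrative assumptions and do not reproduce the authors' exact DCNN; only the squeeze (global average pooling), excitation (two dense layers), and channel reweighting steps follow the description in the abstract.

# Minimal Squeeze-and-Excitation (SE) block sketch in Keras.
# The reduction ratio and the toy classifier around it are illustrative
# assumptions, not the authors' exact DCNN configuration.
import tensorflow as tf
from tensorflow.keras import layers

def se_block(feature_map, reduction=16):
    """Reweight the channels of a feature map by learned importance scores."""
    channels = feature_map.shape[-1]
    # Squeeze: global average pooling collapses each channel to a single value.
    squeezed = layers.GlobalAveragePooling2D()(feature_map)
    # Excite: two dense layers produce per-channel weights in (0, 1).
    excited = layers.Dense(channels // reduction, activation="relu")(squeezed)
    excited = layers.Dense(channels, activation="sigmoid")(excited)
    # Reshape to (1, 1, channels) and rescale the original feature map.
    weights = layers.Reshape((1, 1, channels))(excited)
    return layers.Multiply()([feature_map, weights])

# Toy binary emotion classifier with one convolution stage followed by an SE block.
inputs = layers.Input(shape=(48, 48, 1))            # FER-2013 images are 48x48 grayscale
x = layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
x = se_block(x)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # binary label, e.g. anger vs. no anger
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

After training such a model, the standard Keras model.evaluate call returns the configured metrics on held-out data, which corresponds to the evaluation procedure mentioned in the abstract; recall and F1-score can be computed from the model predictions with a library such as scikit-learn.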

Keywords: facial expression, emotion, age, classification, DCNN, PSS
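As a complementary sketch, the following Python code illustrates the routing rule summarized in the abstract (senior operator for customers over 60 years old or with an anger level of 60–80 %, psychologist for an anger level of 80–100 %) together with a Newton divided-difference interpolation of waiting time from the detected anger level. The function names, the interpolation nodes, and the precedence given to the psychologist rule are hypothetical; the abstract only states that the Newton interpolation polynomial is used to compute waiting times from the emotional state.

# Hypothetical sketch of the routing rule described in the abstract.
# The interpolation nodes (anger level -> waiting time) are invented for illustration.

def newton_coefficients(xs, ys):
    """Divided-difference coefficients of the Newton interpolation polynomial."""
    coeffs = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - j])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate the Newton polynomial at x using Horner-style nesting."""
    result = coeffs[-1]
    for i in range(len(coeffs) - 2, -1, -1):
        result = result * (x - xs[i]) + coeffs[i]
    return result

# Illustrative nodes: anger level (%) -> assigned waiting time (minutes).
ANGER_NODES = [0.0, 25.0, 50.0, 75.0, 100.0]
WAIT_NODES = [15.0, 12.0, 8.0, 4.0, 1.0]
WAIT_COEFFS = newton_coefficients(ANGER_NODES, WAIT_NODES)

def route_customer(age, anger_percent):
    """Assign an operator type and an interpolated waiting time."""
    if 80.0 <= anger_percent <= 100.0:
        operator = "psychologist"
    elif age > 60 or 60.0 <= anger_percent < 80.0:
        operator = "senior operator"
    else:
        operator = "regular operator"
    wait = newton_eval(ANGER_NODES, WAIT_COEFFS, anger_percent)
    return {"operator": operator, "waiting_time_min": round(wait, 1)}

print(route_customer(age=67, anger_percent=45.0))   # -> senior operator
print(route_customer(age=30, anger_percent=85.0))   # -> psychologist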

References
  1. Adigwe P., Okoro E. Human communication and effective interpersonal relationships: an analysis of client counseling and emotional stability. International Journal of Economics & Management Sciences, 2016, vol. 5, no. 3. https://doi.org/10.4172/2162-6359.1000336
  2. Giangreco A., Carugati A., Sebastiano A., Al Tamimi H. War outside, ceasefire inside: An analysis of the performance appraisal system of a public hospital in a zone of conflict. Evaluation and Program Planning, 2012, vol. 35, no. 1, pp. 161–170. https://doi.org/10.1016/j.evalprogplan.2010.11.004
  3. Knutson B. Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior, 1996, vol. 20, no. 3, pp. 165–182. https://doi.org/10.1007/bf02281954
  4. Compas B.E. Psychobiological processes of stress and coping: implications for resilience in children and adolescents—comments on the papers of Romeo & McEwen and Fisher et al. Annals of the New York Academy of Sciences, 2006, vol. 1094, no. 1, pp. 226–234. https://doi.org/10.1196/annals.1376.024
  5. Zapf D. Emotion work and psychological well-being: A review of the literature and some conceptual considerations. Human Resource Management Review, 2002, vol. 12, no. 2, pp. 237–268. https://doi.org/10.1016/s1053-4822(02)00048-7
  6. Dong Y., Liu Y., Lian S. Automatic age estimation based on deep learning algorithm. Neurocomputing, 2016, vol. 187, pp. 4–10. https://doi.org/10.1016/j.neucom.2015.09.115
  7. Giannakakis G., Koujan M.R., Roussos A., Marias K. Automatic stress analysis from facial videos based on deep facial action units recognition. Pattern Analysis and Applications, 2022, vol. 25, no. 3, pp. 521–535. https://doi.org/10.1007/s10044-021-01012-9
  8. Gwyn T., Roy K., Atay M. Face recognition using popular deep net architectures: A brief comparative study. Future Internet, 2021, vol. 13, no. 7, pp. 164. https://doi.org/10.3390/fi13070164
  9. Kumar S., Singh S., Kumar J., Prasad K.M.V.V. Age and gender classification using Seg-Net based architecture and machine learning. Multimedia Tools and Applications, 2022, vol. 81, no. 29, pp. 42285–42308. https://doi.org/10.1007/s11042-021-11499-3
  10. Leist A., Playne D.P., Hawick K.A. Exploiting graphical processing units for data-parallel scientific applications. Concurrency and Computation: Practice and Experience, 2009, vol. 21, no. 18, pp. 2400–2437. https://doi.org/10.1002/cpe.1462
  11. Madhavi M., Gujar I., Jadhao V., Gulwani R. Facial emotion classifier using convolutional neural networks for reaction review. ITM Web of Conferences, 2022, vol. 44, pp. 03055. https://doi.org/10.1051/itmconf/20224403055
  12. Vdovina M.V. The regulation of conflict between social worker and recipient of social services. Wschodnioeuropejskie Czasopismo Naukowe (East European Scientific Journal), 2015, vol. 3, no. 3, pp. 124–131. (in Russian)
  13. Mouatasim A.E. Fast gradient descent algorithm for image classification with neural networks. Signal, Image and Video Processing, 2020, vol. 14, no. 8, pp. 1565–1572. https://doi.org/10.1007/s11760-020-01696-2
  14. Nguyen H.-D., Kim S.-H., Lee G.-S., Yang H.-J., Na I.-S., Kim S.-H. Facial expression recognition using a temporal ensemble of multi-level convolutional neural networks. IEEE Transactions on Affective Computing, 2022, vol. 13, no. 1, pp. 226–237. https://doi.org/10.1109/TAFFC.2019.2946540
  15. Nikishina I.Y. The expression of «anger» concept in the modern English and American fiction. Extended abstract of Candidate of Philological Sciences Dissertation. Moscow, 2008. 23 p. (in Russian).
  16. Olaide O.B., Ojo A.K. A model for conflicts’ prediction using deep neural network. International Journal of Computer Applications, 2021, vol. 183, no. 29, pp. 8–13. https://doi.org/10.5120/ijca2021921667
  17. Paper D. Classification from complex training sets. Hands-on Scikit-Learn for Machine Learning Applications. Apress, Berkeley, CA, 2020, pp. 71–104. https://doi.org/10.1007/978-1-4842-5373-1_3
  18. Piatak J., Romzek B., LeRoux K., Johnston J. Managing goal conflict in public service delivery networks: Does accountability move up and down, or side to side? Public Performance & Management Review, 2018, vol. 41, no. 1, pp. 152–176. https://doi.org/10.1080/15309576.2017.1400993
  19. Pollak S.D., Camras L.A., Cole P.M. Progress in understanding the emergence of human emotion. Developmental Psychology, 2019, vol. 55, no. 9, pp. 1801–1811. https://doi.org/10.1037/dev0000789
  20. Reichel L. Newton interpolation at Leja points. BIT, 1990, vol. 30, no. 2, pp. 332–346. https://doi.org/10.1007/BF02017352
  21. Rodriques M.V. Perspectives of Communication and Communicative Competence. Concept Publishing Company, 2000, 390 p.
  22. Ryumina E.V., Karpov A.A. Analytical review of methods for emotion recognition by human face expressions. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2020, vol. 20, no. 2, pp. 163–176 (in Russian). https://doi.org/10.17586/2226-1494-2020-20-2-163-176
  23. El-Glaly Y.N., Quek F. Digital reading support for the blind by multimodal interaction. ICMI '14: Proc. of the 16th International Conference on Multimodal Interaction, 2014, pp. 439–446. https://doi.org/10.1145/2663204.2663266
  24. Hameduddin T., Engbers T. Leadership and public service motivation: a systematic synthesis. International Public Management Journal, 2022, vol. 25, no. 1, pp. 86–119. https://doi.org/10.1080/10967494.2021.1884150
  25. Thrassou A., Santoro G., Leonidou E., Vrontis D., Christofi M. Emotional intelligence and perceived negative emotions in intercultural service encounters: Building and utilizing knowledge in the banking sector. European Business Review, 2020, vol. 32, no. 3, pp. 359–381. https://doi.org/10.1108/ebr-04-2019-0059
  26. Varma S., Shinde M., Chavan S.S. Analysis of PCA and LDA features for facial expression recognition using SVM and HMM classifiers. Techno-Societal 2018: Proc. of the 2nd International Conference on Advanced Technologies for Societal Applications. V. 1, 2020, pp. 109–119. https://doi.org/10.1007/978-3-030-16848-3_11
  27. Petri V., Jari K. Public service systems and emerging systemic governance challenges. International Journal of Public Leadership, 2015, vol. 11, no. 2, pp. 77–91. https://doi.org/10.1108/IJPL-02-2015-0007
  28. Wu Z., Shen C., van den Hengel A. Wider or deeper: Revisiting the ResNet model for visual recognition. Pattern Recognition, 2019, vol. 90, pp. 119–133. https://doi.org/10.1016/j.patcog.2019.01.006
  29. Tian Y. Evaluation of face resolution for expression analysis. Proc. of the 2004 Conference on Computer Vision and Pattern Recognition Workshop, 2004, pp. 82. https://doi.org/10.1109/CVPR.2004.334
  30. Zaghbani S., Bouhlel M.S. Multi-task CNN for multi-cue affects recognition using upper-body gestures and facial expressions. International Journal of Information Technology, 2022, vol. 14, no. 1, pp. 531–538. https://doi.org/10.1007/s41870-021-00820-w



This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License